The algorithm is described in Xavier and Habier (2022). The model for the ridge regression (mrr) is as follows:
$$Y = Mu + XB + E$$
where \(Y\) is a matrix of response variables, \(Mu\) represents the intercepts, \(X\) is the matrix of genotypic information, \(B\) is the matrix of marker effects, and \(E\) is the residual matrix.
The model for the kernel regression (mkr) is as follows:
$$Y = Mu + UB + E$$
where \(Y\) is a matrix of response variables, \(Mu\) represents the intercepts, \(U\) is the matrix of eigenvectors of \(K\), \(B\) is the matrix of regression coefficients, and \(E\) is the residual matrix.
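In mkr the regressors are therefore the eigenvectors of \(K\). A self-contained sketch of where \(U\) comes from, using a toy 2x2 kernel and the closed-form eigenpairs of a symmetric 2x2 matrix (the values and the Python code are assumptions for illustration only, not the package's implementation):

```python
import math

# Hypothetical 2x2 kernel matrix K (symmetric, e.g. a genomic relationship matrix)
K = [[2.0, 0.5],
     [0.5, 1.0]]

a, b, c = K[0][0], K[0][1], K[1][1]

# Closed-form eigenvalues of a symmetric 2x2 matrix
mean, diff = (a + c) / 2, (a - c) / 2
r = math.sqrt(diff * diff + b * b)
eigvals = [mean + r, mean - r]

# Corresponding unit eigenvectors: for eigenvalue lam, (b, lam - a) works when b != 0
U = []
for lam in eigvals:
    v = (b, lam - a)
    n = math.hypot(*v)
    U.append((v[0] / n, v[1] / n))

# Verify K u = lam u for each pair; the columns of U are the regressors in mkr
for lam, u in zip(eigvals, U):
    Ku = (K[0][0] * u[0] + K[0][1] * u[1],
          K[1][0] * u[0] + K[1][1] * u[1])
    assert abs(Ku[0] - lam * u[0]) < 1e-9
    assert abs(Ku[1] - lam * u[1]) < 1e-9
```

Regressing on \(U\) rather than on \(K\) itself decorrelates the predictors, which is what makes the kernel model tractable with the same machinery as the ridge model.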
Algorithm: Residuals are assumed to be independent among traits. Regression coefficients are solved via a multivariate adaptation of the Gauss-Seidel Residual Update. Since version 2.0, the mrr solver
is based on the Randomized Gauss-Seidel algorithm. Variance and covariance components are estimated with an EM-REML-like approach proposed by Schaeffer, called Pseudo-Expectation.
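A minimal single-trait sketch of the Gauss-Seidel Residual Update (the simulated data, the fixed shrinkage \(lam\), and the sequential sweep order are assumptions for illustration; the package's solver is multivariate, updates shrinkage from the variance components, and since version 2.0 visits markers in random order):

```python
import random

random.seed(1)
n, p, lam = 50, 5, 2.0

# Simulate genotypes X (coded 0/1/2) and phenotype y = X b_true + noise
b_true = [1.0, -0.5, 0.0, 0.8, 0.0]
X = [[random.choice([0.0, 1.0, 2.0]) for _ in range(p)] for _ in range(n)]
y = [sum(X[i][j] * b_true[j] for j in range(p)) + random.gauss(0, 0.3)
     for i in range(n)]

xx = [sum(X[i][j] ** 2 for i in range(n)) for j in range(p)]  # x_j'x_j
b = [0.0] * p
e = y[:]  # residuals start at the phenotype

for _ in range(200):
    for j in range(p):  # a Randomized GS variant would visit j in random order
        xe = sum(X[i][j] * e[i] for i in range(n))          # x_j'e
        b_new = (xe + xx[j] * b[j]) / (xx[j] + lam)          # shrunken update
        delta = b_new - b[j]
        for i in range(n):                                   # residual update:
            e[i] -= X[i][j] * delta                          # e <- e - x_j*delta
        b[j] = b_new
```

At convergence each coefficient satisfies the ridge stationarity condition \(x_j'e = lam\,b_j\), so only the residual vector, not the full normal equations, needs to be kept in memory.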
Other related implementations:
01) mkr2X(Y,K1,K2):
Solves multi-trait kernel regressions with two random effects.
02) mrr2X(Y,X1,X2):
Solves multi-trait ridge regressions with two random effects.
03) MRR3(Y,X,...):
Extension of mrr with additional parameters.
04) MRR3F(Y,X,...):
MRR3 running on float.
05) mrr_svd(Y,W):
Solves mrr through the principal components of parameters.
06) MLM(Y,X,Z,maxit=500,logtol=-8,cores=1):
Multivariate model with fixed effects.
07) SEM(Y,Z,...):
Fits a MegaSEM with both shared- and trait-specific terms.
08) MEGA(Y,X,npc=-1):
Toy implementation of MegaLMM, imputing missing values with GEBVs.
09) GSEM(Y,X,npc=-1):
Toy C++ implementation of MegaSEM, jointly fitting FA and XB.
10) ZSEMF(Y,X,npc=0):
Full-rank MegaSEM, float precision.
11) YSEMF(Y,X,npc=-1):
Reduced-rank MegaSEM, float precision, two-step approach.
12) XSEMF(Y,X,npc=0):
Full-rank MegaSEM, h2 fixed at 0.5, float precision.
In GSEM, XSEMF and MEGA, 'npc' is the number of latent spaces when the input is greater than zero; otherwise, 0 means all and -1 means 2*sqrt(ncol(Y)).
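A small Python sketch of this convention (the function name and the rounding of 2*sqrt(ncol(Y)) to an integer are assumptions):

```python
import math

def n_latent(npc, n_traits):
    """Number of latent spaces under the assumed 'npc' convention:
    npc > 0 -> that many; npc == 0 -> all traits; npc == -1 -> 2*sqrt(ncol(Y))."""
    if npc > 0:
        return npc
    if npc == 0:
        return n_traits
    return round(2 * math.sqrt(n_traits))
```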